Verifying global minima for L2 minimization problems

Authors

  • Richard I. Hartley
  • Yongduek Seo
Abstract

We consider the least-squares (L2) triangulation problem and structure-and-motion with known rotation, or known plane. Although optimal algorithms have been given for these problems under an L-infinity cost function, finding optimal least-squares (L2) solutions is difficult, since the cost functions are not convex and in the worst case can have multiple minima. Iterative methods can usually be used to find a good solution, but this may be only a local minimum. This paper provides a method for verifying whether a local-minimum solution is globally optimal, via a simple and rapid test involving the Hessian of the cost function. In tests on a data set involving 277,000 independent triangulation problems, the test verifies the global optimality of an iterative solution in over 99.9% of the cases.
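
The verification test itself is not reproduced here, but its ingredients are easy to illustrate. The sketch below (NumPy; the cameras P1 and P2, the observations and the candidate point X_hat are hypothetical) builds the L2 reprojection cost for a two-view triangulation and inspects the eigenvalues of a finite-difference Hessian at the candidate. A positive-definite Hessian on its own certifies only a strict local minimum; the paper's test is a stronger Hessian-based criterion aimed at certifying global optimality.

# Minimal numerical sketch: L2 reprojection cost for two-view triangulation and
# a finite-difference Hessian check at a candidate solution.  The cameras,
# observations and candidate point are hypothetical; a positive-definite Hessian
# certifies only a strict local minimum, not the paper's global-optimality test.
import numpy as np

def reprojection_cost(X, cameras, observations):
    """Sum of squared reprojection errors for a 3D point X (shape (3,))."""
    cost = 0.0
    for P, x in zip(cameras, observations):
        proj = P @ np.append(X, 1.0)           # project in homogeneous coordinates
        cost += np.sum((proj[:2] / proj[2] - x) ** 2)
    return cost

def numerical_hessian(f, X, eps=1e-5):
    """Central-difference Hessian of a scalar function f at X."""
    n = X.size
    H = np.zeros((n, n))
    for i in range(n):
        for j in range(n):
            ei, ej = np.eye(n)[i] * eps, np.eye(n)[j] * eps
            H[i, j] = (f(X + ei + ej) - f(X + ei - ej)
                       - f(X - ei + ej) + f(X - ei - ej)) / (4 * eps ** 2)
    return H

# Hypothetical two-view setup and a candidate point from an iterative solver.
P1 = np.hstack([np.eye(3), np.zeros((3, 1))])
P2 = np.hstack([np.eye(3), np.array([[-1.0], [0.0], [0.0]])])
X_true = np.array([0.2, -0.1, 4.0])
obs = []
for P in (P1, P2):
    proj = P @ np.append(X_true, 1.0)
    obs.append(proj[:2] / proj[2])
X_hat = X_true + 1e-3                          # pretend this came from the iteration

H = numerical_hessian(lambda X: reprojection_cost(X, (P1, P2), obs), X_hat)
print("smallest Hessian eigenvalue:", np.linalg.eigvalsh(H).min())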

Similar articles

Minimization of ℓ1-2 for Compressed Sensing

We study minimization of the difference of ℓ1 and ℓ2 norms as a non-convex and Lipschitz continuous metric for solving constrained and unconstrained compressed sensing problems. We establish exact (stable) sparse recovery results under a restricted isometry property (RIP) condition for the constrained problem, and a full-rank theorem of the sensing matrix restricted to the support of the sparse...

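As a rough illustration of the ℓ1-ℓ2 idea in the entry above, the sketch below minimizes a regularized objective (1/2)||Ax - b||^2 + lambda (||x||_1 - ||x||_2) with a difference-of-convex outer loop (the concave -||x||_2 term is linearized at the current iterate) and an ISTA inner solver. The matrix A, the sparse signal, lambda and the iteration counts are invented for illustration; this is not the algorithm as given in that paper.

# Sketch of l1-minus-l2 regularized recovery: a DCA-style outer loop linearizes
# the concave -||x||_2 term, and ISTA (soft-thresholding) solves each convex
# l1-regularized subproblem.  A, b, lam and iteration counts are illustrative only.
import numpy as np

def soft_threshold(v, t):
    return np.sign(v) * np.maximum(np.abs(v) - t, 0.0)

def l1_minus_l2(A, b, lam=0.1, outer=20, inner=200):
    n = A.shape[1]
    x = np.zeros(n)
    step = 1.0 / np.linalg.norm(A, 2) ** 2        # 1/L for the quadratic data term
    for _ in range(outer):
        nrm = np.linalg.norm(x)
        g = x / nrm if nrm > 0 else np.zeros(n)   # subgradient of ||x||_2
        for _ in range(inner):
            grad = A.T @ (A @ x - b) - lam * g    # gradient of smooth + linearized part
            x = soft_threshold(x - step * grad, step * lam)
    return x

# Hypothetical sparse-recovery instance.
rng = np.random.default_rng(0)
A = rng.standard_normal((40, 100))
x_true = np.zeros(100)
x_true[[3, 17, 58]] = [1.0, -2.0, 0.5]
b = A @ x_true
x_rec = l1_minus_l2(A, b)
print("recovered support:", np.flatnonzero(np.abs(x_rec) > 0.1))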

Global optimization of factor models using alternating minimization

Learning new representations in machine learning is often tackled using a factorization of the data. For many such problems, including sparse coding and matrix completion, learning these factorizations can be difficult, both in terms of efficiency and in guaranteeing that the solution is a global minimum. Recently, a general class of objectives has been introduced, called induced regularized factor mo...


Global optimization of factor models and dictionary learning using alternating minimization

Learning new representations in machine learning is often tackled using a factorization of the data. For many such problems, including sparse coding and matrix completion, learning these factorizations can be difficult, both in terms of efficiency and in guaranteeing that the solution is a global minimum. Recently, a general class of objectives has been introduced, called induced regularized factor mo...

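The two entries above concern alternating minimization for factor models. As a generic illustration (not the induced-regularization framework those papers describe), the sketch below alternates closed-form ridge-regularized least-squares updates of the two factors of a low-rank model X ≈ U V^T; the data matrix, rank and regularization weight are invented.

# Alternating minimization for a factor model X ~= U @ V.T with Frobenius (ridge)
# regularization: each factor has a closed-form update while the other is fixed.
# Data, rank and regularization weight are hypothetical illustration values.
import numpy as np

def als_factorize(X, rank=5, reg=0.1, iters=50):
    m, n = X.shape
    rng = np.random.default_rng(0)
    U = rng.standard_normal((m, rank))
    V = rng.standard_normal((n, rank))
    I = np.eye(rank)
    for _ in range(iters):
        # fix V, minimize ||X - U V^T||_F^2 + reg * ||U||_F^2 over U (closed form)
        U = X @ V @ np.linalg.inv(V.T @ V + reg * I)
        # fix U, solve the symmetric problem for V
        V = X.T @ U @ np.linalg.inv(U.T @ U + reg * I)
    return U, V

# Hypothetical low-rank data matrix.
rng = np.random.default_rng(1)
X = rng.standard_normal((60, 5)) @ rng.standard_normal((5, 40))
U, V = als_factorize(X)
print("relative reconstruction error:", np.linalg.norm(X - U @ V.T) / np.linalg.norm(X))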

Simulated Annealing with Asymptotic Convergence for Nonlinear Constrained Global Optimization

In this paper, we present constrained simulated annealing (CSA), a global minimization algorithm that converges to constrained global minima with probability one, for solving nonlinear discrete non-convex constrained minimization problems. The algorithm is based on the necessary and sufficient condition for constrained local minima in the theory of discrete Lagrange multipliers we developed earli...


Simulated Annealing with Asymptotic Convergence for Nonlinear Constrained Global Optimization

In this paper, we present constrained simulated annealing (CSA), a global minimization algorithm that converges to constrained global minima with probability one, for solving nonlinear discrete nonconvex constrained minimization problems. The algorithm is based on the necessary and sufficient condition for constrained local minima in the theory of discrete Lagrange multipliers we developed earl...

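Constrained simulated annealing as described in the two entries above anneals over both the variables and discrete Lagrange multipliers. The toy sketch below shows only the basic Metropolis accept/reject mechanics of simulated annealing applied to a penalized objective; the objective, constraint, cooling schedule and all constants are invented, and this is not the CSA algorithm itself.

# Toy simulated annealing on a penalized objective f(x) + penalty * max(0, g(x))^2
# for the constraint g(x) <= 0.  Illustrates Metropolis accept/reject only; CSA
# additionally performs probabilistic ascent on the Lagrange multipliers, which is
# not reproduced here.  All functions and constants are invented.
import math
import random

def objective(x):
    return (x[0] - 1.0) ** 2 + (x[1] + 2.0) ** 2

def constraint(x):
    return x[0] + x[1] + 1.5                 # require x0 + x1 <= -1.5

def penalized(x, penalty=100.0):
    return objective(x) + penalty * max(0.0, constraint(x)) ** 2

def anneal(steps=20000, t0=1.0, cooling=0.9995):
    random.seed(0)
    x = [0.0, 0.0]
    best = list(x)
    t = t0
    for _ in range(steps):
        cand = [xi + random.gauss(0.0, 0.1) for xi in x]   # random neighbour
        delta = penalized(cand) - penalized(x)
        if delta < 0 or random.random() < math.exp(-delta / t):
            x = cand                                        # Metropolis acceptance
            if penalized(x) < penalized(best):
                best = list(x)
        t *= cooling                                        # geometric cooling
    return best

print("best point found:", anneal())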

Journal title:

Volume   Issue

Pages  -

Publication date: 2008